12 research outputs found

    Source-Device-Independent Ultrafast Quantum Random Number Generation

    Secure random numbers are a fundamental element of many applications in science, statistics, cryptography and, more generally, in security protocols. We present a method that enables the generation of high-speed, unpredictable random numbers from the quadratures of an electromagnetic field without any assumption on the input state. The method allows us to eliminate the numbers that could be predicted owing to the presence of classical and quantum side information. In particular, we introduce a procedure to estimate a bound on the conditional min-entropy based on the entropic uncertainty principle for the position and momentum observables of infinite-dimensional quantum systems. With this method we experimentally demonstrated the generation of secure true random bits at a rate greater than 1.7 Gbit/s.
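    The core quantity of such a procedure is a lower bound on the conditional min-entropy obtained from the entropic uncertainty principle. In the form commonly used for coarse-grained conjugate quadrature measurements Q and P with resolutions δq and δp (the exact overlap constant involves a prolate spheroidal wavefunction; the approximation below holds for fine resolution and is stated here as an illustration, not as the paper's exact expression):

        H_{\min}(Q_{\delta q} \mid E) \;\ge\; \log_2\frac{1}{c(\delta q,\delta p)} \;-\; H_{\max}(P_{\delta p}),
        \qquad c(\delta q,\delta p) \approx \frac{\delta q\,\delta p}{2\pi\hbar},

    where E denotes the quantum side information of a potential adversary and the max-entropy of the check quadrature P is estimated directly from the measured data.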

    Improving Quantum Key Distribution and Quantum Random Number Generation in presence of Noise

    The argument of this thesis might be summed up as the exploitation of noise to generate better noise. More specifically, this work is about the possibility of exploiting classical noise to transmit quantum information effectively, and of measuring quantum noise to generate better quantum randomness. What do I mean by exploiting classical noise to transmit quantum information effectively? I refer to the task of sending quantum bits through the atmosphere in order to set up quantum key distribution (QKD) transmissions, which is the subject of Chapter 1 and Chapter 2.
In the framework of quantum communications, QKD is a topic with challenging problems, both theoretical and experimental. In principle QKD offers unconditional security; however, its practical realizations must face all the limitations of the real world. One of the main limitations is the loss introduced by real transmission channels. Losses cause errors, and errors make the protocol less secure, because an eavesdropper could try to hide his activity behind the losses. When this problem is addressed from a purely theoretical point of view, one models the effect of losses by means of unitary transformations that affect the qubits, on average, according to a fixed level of link attenuation. This approach is somewhat limiting: if the background noise is high and the losses are assumed constant on average, the protocol might abort or not even start, since the predicted quantum bit error rate (QBER) is too high. To address this problem and generate a key when it would normally not be possible, we have proposed an adaptive real time selection (ARTS) scheme in which transmissivity peaks are detected instantaneously. An additional resource is introduced to estimate the link transmissivity on its intrinsic time scale: an auxiliary classical laser beam co-propagating with the qubits but conveniently interleaved in time. In this way the link scintillation is monitored in real time, and the time intervals of high channel transmissivity, corresponding to a QBER viable for positive key generation, can be selected. In Chapter 2 we present a demonstration of this protocol under losses equivalent to long-distance and satellite links, and with a range of scintillation corresponding to moderate to severe weather. A useful criterion for the preselection of the low-QBER intervals is presented, which employs a train of intense pulses propagating along the same path as the qubits, with parameters chosen such that their fluctuations in time reproduce those of the quantum communication.
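As an illustration of the selection logic behind ARTS, the following is a minimal sketch in Python. It assumes one has a time series of probe-beam intensities (a proxy for the instantaneous transmissivity) and the arrival times of the qubit detections; all names, parameters and the synthetic log-normal fading used in the example are hypothetical and for illustration only, not taken from the thesis.

    import numpy as np

    def arts_select(probe_samples, probe_times, qubit_times, threshold):
        """Keep only qubit detection events that fall inside time bins where the
        co-propagating classical probe indicates high channel transmissivity."""
        probe_samples = np.asarray(probe_samples)
        probe_times = np.asarray(probe_times)    # start time of each probe bin, sorted
        qubit_times = np.asarray(qubit_times)

        good_bins = probe_samples >= threshold   # bins above the transmissivity threshold

        # Assign each qubit event to the probe bin that precedes it.
        bin_idx = np.searchsorted(probe_times, qubit_times, side="right") - 1
        valid = (bin_idx >= 0) & (bin_idx < probe_samples.size)

        keep = valid & good_bins[np.clip(bin_idx, 0, probe_samples.size - 1)]
        return qubit_times[keep]

    # Example: synthetic scintillating channel with 1 ms probe bins.
    rng = np.random.default_rng(0)
    t_bins = np.arange(0.0, 1.0, 1e-3)
    probe = rng.lognormal(mean=-1.0, sigma=0.8, size=t_bins.size)   # log-normal fading
    events = np.sort(rng.uniform(0.0, 1.0, size=10_000))            # qubit detection times
    selected = arts_select(probe, t_bins, events, threshold=np.quantile(probe, 0.8))
    print(f"kept {selected.size} of {events.size} detection events")

In a real implementation the accepted intervals would then feed the usual QKD post-processing (sifting, error correction and privacy amplification) restricted to those time slots.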
Chapter 3 describes a novel principle for a true random number generator (TRNG), based on the observation that a coherent beam of light crossing a very long path with atmospheric turbulence generates random, rapidly varying images. To implement our method in a proof-of-concept demonstrator, we chose a very long free-space channel used in recent years for quantum communication experiments at the Canary Islands. There, after a propagation of 143 km with the terminals at an altitude of about 2400 m, the turbulence along the path is converted into a dynamical speckle pattern at the receiver. The source of entropy is thus the atmospheric turbulence itself: for such a long path, a solution of the Navier-Stokes equations for the atmospheric flow in which the beam propagates is out of reach. Several models are based on the Kolmogorov statistical theory, which describes the repartition of kinetic energy as a cascade through eddies of decreasing size. However, such models only provide a statistical description of the beam spot and its wandering, never an instantaneous prediction of the irradiance distribution. The turbulence is mainly driven by temperature variations and by the wind, which cause fluctuations of the air refractive index. For this reason, when a laser beam is sent across the atmosphere, the latter may be considered a dynamic volumetric scatterer that distorts the beam wavefront. We evaluate the experimental data to verify that the images are uniform and independent, and we show that our randomness-extraction method, based on combinatorial analysis, is optimal in the sense of Information Theory.
In Chapter 5 we present a new approach to the generation of random bits from quantum physical processes. Quantum mechanics has always been regarded as a possible and valuable source of randomness because of its intrinsically probabilistic nature. However, the typical paradigm employed to extract random numbers from a quantum system assumes that the state of the system is pure, and only under this assumption does one obtain fully unpredictable randomness. The main issue is that in real implementations, such as in a laboratory or in a commercial device, it is hardly possible to prepare a pure quantum state: one has to deal with quantum states featuring some degree of mixedness. A mixed state, however, might be correlated with some other system held by an adversary, a quantum eavesdropper. In the extreme case of a fully mixed state, one is in practice extracting random numbers from a classical state. To account for this, we show why it is important to shift from a classical randomness estimator, such as the classical min-entropy H_min(Z) of a random variable Z, to a quantum one, namely the min-entropy conditioned on the quantum side information E. We have devised an effective protocol, based on the entropic uncertainty principle, for the estimation of this conditional min-entropy. The entropic uncertainty principle allows one to take into account the information shared between multiple parties holding a multipartite quantum system and, more importantly, to bound the information a party has on the state of the system after it has been measured. We adapted this principle to the bipartite case in which a user Alice, A, is supplied with a quantum system prepared by the provider Eve, E, which could be maliciously correlated with it. In principle, Eve might then be able to predict all the outcomes of the measurements Alice performs in the basis Z to extract random numbers from the system. However, we show that if Alice randomly switches from that measurement basis to a basis X mutually unbiased with respect to Z, she can lower-bound the min-entropy conditioned on Eve's side information. In this way Alice can expand a small initial random seed into a much larger amount of trusted random numbers. We present the results of an experimental demonstration of the protocol, in which random numbers passing the most rigorous classical tests of randomness were produced.
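The quantitative handle behind this argument is the entropic uncertainty relation with quantum side information. In the form relevant here (two mutually unbiased bases Z and X of a d-dimensional system, with no quantum memory assisting the check measurement; this is a compact restatement, not the thesis' exact notation):

    H_{\min}(Z \mid E) + H_{\max}(X) \;\ge\; \log_2 d
    \quad\Longrightarrow\quad
    H_{\min}(Z \mid E) \;\ge\; \log_2 d - H_{\max}(X),

so the max-entropy of the check-basis outcomes, estimated from Alice's occasional X-basis measurements, directly lower-bounds the randomness extractable from the Z-basis outcomes despite Eve's correlations.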
In Chapter 6 we provide a secure generation scheme for a continuous-variable (CV) QRNG. Since true random numbers are an invaluable resource for both classical information technology and the emerging quantum one, sustaining the present and ever-growing flux of data to be encrypted requires quantum random number generators able to deliver bits at gigabit or even terabit per second rates. The literature offers several examples of QRNG protocols which, in theory, could reach such rates. Typically, these are based on the exploitation of the quadratures of the electromagnetic field, regarded as an infinite-dimensional bosonic quantum system. The quadratures of the field can be measured with a well-known scheme, homodyne detection, which in principle yields a noise of unlimited bandwidth. Consequently, the bandwidth of the random signal is limited only by the passband of the devices used to measure it. Photodiode detectors commonly operate in the GHz band, so if one samples the signal with a sufficiently fast ADC, gigabit or terabit rates can easily be reached. However, as in the case of discrete-variable QRNGs, the protocols found in the literature do not properly consider the purity of the quantum state being measured. Our idea has been to extend the discrete-variable protocol of the previous chapter to the continuous case. We show that in the CV framework one must face not only the problem of state purity but also the problem of the finite precision of the measurements used to extract the randomness.
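As a rough illustration of why homodyne-based CV QRNGs can reach such rates, the short Python sketch below combines an assumed per-sample conditional min-entropy with the leftover hash lemma to estimate a secure output rate. Every number in it (sampling rate, ADC depth, min-entropy per sample, extractor security parameter) is an illustrative assumption, not a value from the thesis.

    import math

    sample_rate = 1.25e9      # ADC samples per second (assumed)
    adc_bits = 10             # raw bits per sample (assumed)
    hmin_per_sample = 6.0     # conditional min-entropy per sample, bits (assumed,
                              # e.g. from an entropic-uncertainty bound)
    epsilon = 1e-10           # security parameter of the randomness extractor
    block_samples = 10**6     # samples hashed together in one extractor block

    # Leftover hash lemma: from n samples with conditional min-entropy Hmin each,
    # a universal (e.g. Toeplitz) hash can extract about n*Hmin - 2*log2(1/eps) bits.
    extractable_bits = block_samples * hmin_per_sample - 2 * math.log2(1 / epsilon)
    secure_rate = extractable_bits * sample_rate / block_samples

    print(f"raw rate    : {sample_rate * adc_bits / 1e9:.2f} Gbit/s")
    print(f"secure rate : {secure_rate / 1e9:.2f} Gbit/s")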

    Adaptive real time selection for quantum key distribution in lossy and turbulent free-space channels

    The unconditional security in the creation of cryptographic keys obtained by quantum key distribution (QKD) protocols will induce a quantum leap in free-space communication privacy, in the same way that we are beginning to realize with secure optical-fiber connections. However, free-space channels, in particular long links in the presence of atmospheric turbulence, are affected by losses, fluctuating transmissivity and background light, which impair the conditions for secure QKD. Here we introduce a method to counteract atmospheric turbulence in QKD experiments. Our adaptive real time selection (ARTS) technique at the receiver is based on the selection of the intervals with higher channel transmissivity. We demonstrate, using data from the Canary Islands 143-km free-space link, that conditions with an unacceptable average quantum bit error rate, which would prevent the generation of a secure key, can be used once parsed according to the instantaneous scintillation with the ARTS technique.
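    A rough way to see why the selection helps (an illustrative approximation with assumed symbols, not an expression from the paper): if η(t) is the instantaneous channel transmissivity, μ the detected signal yield per pulse at unit transmissivity, e_det the intrinsic optical error rate and p_bg the background yield, which contributes errors half of the time, then

        \mathrm{QBER}(t) \;\approx\; \frac{e_{\mathrm{det}}\,\eta(t)\,\mu + p_{\mathrm{bg}}/2}{\eta(t)\,\mu + p_{\mathrm{bg}}},

    so intervals of high η(t) dilute the background contribution and can bring the instantaneous QBER back below the security threshold even when its time average would make key generation impossible.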

    Extending Wheeler's delayed-choice experiment to Space

    Gedankenexperiments have consistently played a major role in the development of quantum theory. A paradigmatic example is Wheeler's delayed-choice experiment, a wave-particle duality test that cannot be fully understood using only classical concepts. Here, we implement Wheeler's idea along a satellite-ground interferometer that extends for thousands of kilometers in Space. We exploit the temporal and polarization degrees of freedom of photons reflected by a fast-moving satellite equipped with retro-reflecting mirrors. We observed the complementary wave-like or particle-like behaviors at the ground station by choosing the measurement apparatus while the photons were propagating from the satellite to the ground. Our results confirm quantum mechanical predictions, demonstrating the need for the dual wave-particle interpretation at this unprecedented scale. Our work paves the way for novel applications of quantum mechanics in Space links involving multiple photon degrees of freedom.

    Lithium–Oxygen Battery Exploiting Highly Concentrated Glyme-Based Electrolytes

    Concentrated solutions of lithium bis(trifluoromethanesulfonyl)imide (LiTFSI) and lithium nitrate (LiNO3) salts in either diethylene-glycol dimethyl-ether (DEGDME) or triethylene-glycol dimethyl-ether (TREGDME) are herein characterized in terms of chemical and electrochemical properties in view of possible applications as the electrolyte in lithium–oxygen batteries. X-ray photoelectron spectroscopy at the lithium metal surface upon prolonged storage in lithium cells reveals the complex composition and nature of the solid electrolyte interphase (SEI) formed through the reduction of the solutions, while thermogravimetric analysis shows a stability that depends on the glyme chain length. The applicability of the solutions in the lithium metal cell is investigated by means of electrochemical impedance spectroscopy (EIS), chronoamperometry, galvanostatic cycling, and voltammetry, which reveal high conductivity and lithium transference number as well as a wide electrochemical stability window for both electrolytes. However, a challenging issue ascribed to the more pronounced evaporation of the electrolyte based on DEGDME with respect to TREGDME limits the application of the former in the Li/O2 battery. Indeed, EIS measurements reveal a very fast increase in the impedance of cells using the DEGDME-based electrolyte upon prolonged exposure to the oxygen atmosphere, which leads to a performance decay of the corresponding Li/O2 battery. Instead, cells using the TREGDME-based electrolyte show remarkable interphase stability and a much-enhanced response, with specific capacity ranging from 500 to 1000 mA h g–1 referred to the carbon mass in the positive electrode and an associated maximum practical energy density of 450 W h kg–1. These results suggest that glyme volatility is a determining factor in allowing the use of these electrolyte media in a Li/O2 cell. Therefore, electrolytes using a glyme with a sufficiently high boiling point, such as TREGDME, further increased by the substantial presence of salts including a sacrificial lithium-protecting one (LiNO3), can enable a safe and high-performance lithium–oxygen battery.

    Characterizing Phase Noise in a Gain-Switched Laser Diode for Quantum Random-Number Generation

    While operating a quantum random-number generator (QRNG), it is extremely useful to have a model of the physical entropy source to guarantee that the device is delivering randomness of genuine quantum origin. In this work we consider a QRNG based on a gain-switched laser diode and we develop a model to quantify its phase noise. The model is based on the laser rate equations and on state-of-the-art techniques for the characterization of laser diodes used in lightwave systems. These tools allow us to achieve a faithful modeling of the phase noise, whose accuracy we verify through comparisons with experimental measurements. Furthermore, the model can be used to select optimal parameters that maximize the QRNG performance and to monitor the device behavior in order to detect malfunctioning or malicious tampering.
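    For context, the class of model referred to is the standard set of single-mode semiconductor-laser rate equations with Langevin noise terms; one common textbook form (symbols and normalizations vary between treatments, so this is a sketch rather than the paper's exact model) is

        \frac{dN}{dt} = \frac{I(t)}{qV} - \frac{N}{\tau_n} - v_g\, g(N)\, S, \qquad
        \frac{dS}{dt} = \left[\Gamma v_g\, g(N) - \frac{1}{\tau_p}\right] S + \frac{\Gamma \beta N}{\tau_n} + F_S(t), \qquad
        \frac{d\phi}{dt} = \frac{\alpha}{2}\left[\Gamma v_g\, g(N) - \frac{1}{\tau_p}\right] + F_\phi(t),

    where N is the carrier density, S the photon density, φ the optical phase, α the linewidth-enhancement factor and F_S, F_φ Langevin terms driven by spontaneous emission. In gain-switched operation the phase diffuses freely between pulses, and this phase diffusion is the quantum entropy source being quantified.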

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing the use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing the use of consumables; and reducing the use of general anaesthesia. Conclusion This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high- and low–middle-income countries.
